Data Center Knowledge
Top Three Use Cases for AI in Cybersecurity
Cybersecurity professionals are facing an unprecedented threat environment, with record-high numbers of attacks, shortage of qualified staff, and increasing aggression and sophistication from nation-state actors. For many data center cybersecurity managers, the silver bullet for all these problems is artificial intelligence. It promises to allow security teams to handle more threats than ever before, of greater complexity than ever before, with fewer and fewer people. In fact, according to a global survey released this past September by Pillsbury, a global law firm focusing on technology, 49% of executives think artificial intelligence is the best tool to counter nation-state cyber attacks. Pillsbury predicts that cybersecurity-related AI spending will increase at a compound annual growth rate of 24% through 2027 to reach a market value of $46 billion.
- Information Technology > Security & Privacy (1.00)
- Government > Military > Cyberwarfare (1.00)
Getting Rid of the Deep Learning Silo in the Data Center
With an electrical engineering education from Purdue University (PhD) and the Indian Institute of Technology Bombay (BS, MS), and nearly 25 years of experience in the semiconductor, systems, and hyperscale service provider industries, the author has a broad perspective on hardware design and deployment. The elasticity of cloud infrastructure is a key enabler for enterprises and internet services, creating a shared pool of compute resources that various tenants can draw from as their workloads ebb and flow. Cloud tenants are spared the details of capacity and supply planning. This model has worked well because modern server systems are very efficient at a multitude of general computing tasks. Deep learning, however, creates new complexities for this model.
- Information Technology > Services (0.93)
- Education > Curriculum > Subject-Specific Education (0.56)
Stepping Stones to Artificial Intelligence in Banking
Neil Barton is the Chief Technology Officer for WhereScape. The banking industry is ripe for disruption. Startup banks are challenging the traditional monolithic financial institutions to find more agile ways of working, to be smarter, and to do more with less. Artificial Intelligence (AI) can be an attractive prospect, but deploying such an advanced technology is not a plug-and-play scenario. The reality is that many banks are still at an interim stage when it comes to AI.
- Information Technology > Security & Privacy (1.00)
- Banking & Finance (1.00)
Three Ways AI Will Transform IT Service Management
Doron Gordon is CEO and Founder of Samanage. In the quest for smarter and faster services, IT departments are pioneers in deploying new methods and processes to improve internal service delivery. In the next 12 months, artificial intelligence (AI) will start driving new breakthrough features in service management that will result in unprecedented efficiencies for IT departments and organizations. By moving service management to the cloud, IT teams have shifted from primarily handling break-fix tickets to building comprehensive service catalogs to help empower employees to get work done faster and more effectively. Now, the next wave of disruption is driven by AI and fueled by unprecedented access to data insights that are available thanks to cloud services.
The Year of Automation and Intelligence for Hyperconverged Systems
Bruce Milne is Vice President and CMO at Pivot3. Another year ends, another round of predictions begins. Looking back to the widespread adoption of virtualization and blade consolidation of the mid-2000s – which reduced and simplified IT environments and paved the way for converged infrastructure – it's clear we've come a long way. Hyperconverged infrastructure (HCI) hardware-based appliances came next, and have since evolved into more software-defined infrastructure platforms. This has transformed HCI into the platform for supporting hybrid cloud mobility and autonomous workload acceleration.
An Industry of Innovation
Tate Cantrell is CTO of Verne Global. Let's break down the acronym ICT: Information and Communication Technology. ICT as a sector and as an industry was created to enable efficient sharing of information. Improved communication between the workers that make up our economies has generated huge returns in terms of GDP growth around the world. No longer does the technology stack merely enable information sharing; the technology stack is now creating the information.
Report: AI Tells AWS How Many Servers to Buy and When
Internet giants Google, Microsoft, Amazon, and Facebook use Machine Learning to enhance their services for end users, such as real-time search suggestions, face recognition in photos, voice commands, or cloud services for software developers, but they also use Artificial Intelligence to optimize their internal operations. Google revealed in 2014 that it uses Machine Learning to improve energy efficiency of its data centers, and Amazon's use of AI to manage warehouses for its e-commerce business hasn't been a secret since at least 2015. So, it comes as no surprise that Amazon Web Services, the company's cloud services arm, also applies Machine Learning to one of the toughest puzzles in data center management: capacity planning. AWS uses Machine Learning to forecast cloud data center capacity demand and to figure out where on the planet to store additional data center components so that it can expand capacity quickly where and when it's needed. AWS CEO Andy Jassy revealed the practice in front of an audience at this week's Foundations of Science Breakfast by the Pacific Science Center, GeekWire reported.
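The capacity-planning idea described above can be sketched in miniature: project demand forward by at least the hardware procurement lead time, so orders are placed before capacity runs out. AWS has not published its actual models, so the least-squares trend fit, the rack counts, and the six-week lead time below are all illustrative assumptions, not the company's method.

```python
# Minimal capacity-forecasting sketch: fit a linear trend to historical
# demand and extrapolate ahead by the procurement lead time. All numbers
# and the choice of a linear model are illustrative assumptions.
def fit_linear_trend(demand):
    """Ordinary least-squares slope and intercept over t = 0..n-1."""
    n = len(demand)
    t_mean = (n - 1) / 2
    d_mean = sum(demand) / n
    slope = (sum((t - t_mean) * (d - d_mean) for t, d in enumerate(demand))
             / sum((t - t_mean) ** 2 for t in range(n)))
    return slope, d_mean - slope * t_mean

def forecast(demand, lead_time_weeks):
    """Projected demand at the point when ordered hardware would arrive."""
    slope, intercept = fit_linear_trend(demand)
    return intercept + slope * (len(demand) - 1 + lead_time_weeks)

# Weekly rack demand (illustrative); order enough to cover 6 weeks out.
weekly_racks = [120, 124, 131, 135, 142, 149]
print(round(forecast(weekly_racks, lead_time_weeks=6)))  # → 183
```

A real forecaster would account for seasonality, regional variation, and launch events, which is where the machine learning Jassy described earns its keep over a straight-line fit like this one.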
NVIDIA CEO: AI Workloads Will "Flood" Data Centers
During a keynote at his company's big annual conference in Silicon Valley last week, NVIDIA CEO Jensen Huang took several hours to announce the chipmaker's latest products and innovations, but also to drive home the inevitability of the force that is Artificial Intelligence. NVIDIA is the top maker of GPUs used in computing systems for Machine Learning, currently the part of the AI field where most action is happening. GPUs work in tandem with CPUs, accelerating the processing necessary to both train machines to do certain tasks and to execute them. "Machine Learning is one of the most important computer revolutions ever," Huang said. "The number of [research] papers in Deep Learning is just absolutely explosive."
- North America > United States > Virginia > Loudoun County > Ashburn (0.05)
- North America > United States > California > San Diego County > San Diego (0.05)
- North America > Canada > Ontario > Toronto (0.05)
How Machine Learning Can Improve IT Operations
Bostjan Kaluza, PhD, is Chief Data Scientist at Evolven Software. IT operations teams often focus on more than one approach to infrastructure monitoring, such as device, network, server, application, and storage, with the implication that the whole is equal to the sum of its parts. According to a 2015 Application Performance Monitoring survey, 65 percent of surveyed companies own more than 10 different monitoring tools. Despite the increase in instrumentation capabilities and the amount of collected data, enterprises barely use these significantly larger data sets to improve availability and performance through root cause analysis and incident prediction. W. Cappelli (Gartner, October 2015) emphasizes in a recent report that "although availability and performance data volumes have increased by an order of magnitude over the last 10 years, enterprises find data in their possession insufficiently actionable … Root causes of performance problems have taken an average of 7 days to diagnose, compared to 8 days in 2005 and only 3 percent of incidents were predicted, compared to 2 percent in 2005".
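One simple building block behind the incident prediction the article alludes to is flagging metric samples that deviate sharply from a learned baseline. The sketch below uses a rolling z-score over a trailing window as a crude stand-in for such a baseline; the window size, threshold, and latency figures are assumptions for illustration, not anything from Evolven's product.

```python
# Hypothetical sketch: flag anomalous response-time samples with a
# rolling z-score over a trailing window. Window size and threshold
# are illustrative assumptions.
from statistics import mean, stdev

def rolling_zscore_anomalies(samples, window=10, threshold=3.0):
    """Return indices of samples deviating strongly from the trailing
    window -- a crude stand-in for a learned performance baseline."""
    anomalies = []
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(samples[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Service latency in ms; the 450 ms spike should be flagged.
latency_ms = [100, 102, 99, 101, 100, 98, 103, 100, 101, 99, 450, 100]
print(rolling_zscore_anomalies(latency_ms))  # → [10]
```

Production systems replace the fixed threshold with models that learn seasonality and cross-metric correlations, which is what lets them move from detecting incidents to predicting them.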
Artificial Intelligence and the Evolution of Data Centers
Charles-Antoine Beyney is Co-Founder and CEO of Etix Everywhere. Data centers are proliferating to meet the relentless demand for IT capacity, seeking greater efficiency every day, and each new innovation is a major step. To meet these requirements, Artificial Intelligence (AI) has arrived, holding tremendous promise for the industry. Facility administrators and IT managers have several critical objectives for their data center operations, but none are as important as uptime and energy efficiency. According to 2016 research by the Ponemon Institute, the average cost of a single data center outage today is approximately $730,000.